New Active-Set Frank-Wolfe Variants for Minimization over the Simplex and the ℓ1-Ball

Authors

  • Andrea Cristofari
  • Marianna De Santis
  • Francesco Rinaldi
Abstract

In this paper, we describe a new active-set algorithmic framework for minimizing a function over the simplex. The method is quite general and encompasses different active-set Frank-Wolfe variants. In particular, we analyze convergence (when an Armijo line search is used to compute the stepsize) for the active-set versions of the standard Frank-Wolfe, away-step Frank-Wolfe and pairwise Frank-Wolfe methods. Then, we focus on convex optimization problems and prove that all active-set variants converge at a linear rate under weaker assumptions than their classical counterparts. We further explain how to adapt our framework to handle the problem of minimizing a function over the ℓ1-ball. Finally, we report numerical experiments showing the efficiency of the various active-set Frank-Wolfe variants.
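The following is a minimal sketch of the standard Frank-Wolfe method over the unit simplex with an Armijo backtracking line search, included only to illustrate the setting the abstract refers to; it is not the paper's active-set framework, and the function names and parameters (f, grad_f, gamma, delta, tol) are illustrative assumptions.

```python
import numpy as np

def frank_wolfe_simplex(f, grad_f, x0, max_iter=1000, gamma=1e-4, delta=0.5, tol=1e-8):
    """Standard Frank-Wolfe over the unit simplex with Armijo line search (sketch)."""
    x = x0.copy()
    for _ in range(max_iter):
        g = grad_f(x)
        # Linear minimization oracle over the simplex: the vertex e_i with i = argmin_i g_i.
        i = np.argmin(g)
        d = -x
        d[i] += 1.0                  # Frank-Wolfe direction e_i - x
        gap = -g @ d                 # Frank-Wolfe (duality) gap
        if gap <= tol:
            break
        # Armijo backtracking: shrink alpha until the sufficient-decrease condition holds.
        alpha = 1.0
        while f(x + alpha * d) > f(x) - gamma * alpha * gap:
            alpha *= delta
        x = x + alpha * d            # stays on the simplex since alpha <= 1
    return x
```

The away-step and pairwise variants mentioned in the abstract differ only in how the search direction is chosen at each iteration; the active-set versions additionally treat the variables estimated active and those estimated non-active separately.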


Similar articles

New Active-Set Frank-Wolfe Variants for Minimization over the Simplex and the L1-Ball

In this paper, we are concerned with minimization problems over the unit simplex. Here, we propose the use of an active-set estimate that enables us to define an algorithmic framework where the variables estimated active and those estimated non-active are updated separately at each iteration. In particular, we consider different variants of the Frank-Wolfe direction to be combined with the prop...
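Below is a schematic Python fragment of this idea of updating the two blocks separately: variables estimated to be active (at zero) are fixed at zero, the remaining mass is redistributed over the free block, and a Frank-Wolfe direction is computed on that block only. The thresholding rule used here is a simplified stand-in, not the active-set estimate derived in the paper; all names are illustrative.

```python
import numpy as np

def split_update_simplex(x, g, eps=1e-8):
    """One schematic iteration: fix estimated-active variables at zero, step on the rest."""
    est_active = (x <= eps) & (g >= 0)     # crude stand-in for the active-set estimate
    free = ~est_active
    y = np.zeros_like(x)
    y[free] = x[free] / x[free].sum()      # redistribute mass onto the free block of the simplex
    # Frank-Wolfe direction restricted to the free variables.
    i = np.where(free)[0][np.argmin(g[free])]
    d = -y
    d[i] += 1.0                            # e_i - y, with estimated-active variables kept at zero
    return y, d
```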


Linear Convergence of a Frank-Wolfe Type Algorithm over Trace-Norm Balls

We propose a rank-k variant of the classical Frank-Wolfe algorithm to solve convex optimization over a trace-norm ball. Our algorithm replaces the top singular-vector computation (1-SVD) in Frank-Wolfe with a top-k singular-vector computation (k-SVD), which can be done by repeatedly applying 1-SVD k times. Our algorithm has a linear convergence rate when the objective function is smooth and str...
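As a reference point, the linear minimization oracle over a trace-norm (nuclear-norm) ball of radius tau is a rank-one matrix built from the top singular vectors of the gradient, which is the 1-SVD step this abstract refers to; the cited rank-k variant would apply such a computation k times. The sketch below is illustrative and the names (G, tau) are assumptions.

```python
import numpy as np
from scipy.sparse.linalg import svds

def fw_vertex_trace_norm(G, tau):
    """Minimizer of <G, S> over the trace-norm ball {S : ||S||_* <= tau} (1-SVD oracle)."""
    u, s, vt = svds(G, k=1)                # top singular triplet of the gradient matrix
    return -tau * np.outer(u[:, 0], vt[0, :])
```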


Frank-Wolfe with Subsampling Oracle

We analyze two novel randomized variants of the Frank-Wolfe (FW) or conditional gradient algorithm. While classical FW algorithms require solving a linear minimization problem over the domain at each iteration, the proposed method only requires solving a linear minimization problem over a small subset of the original domain. The first algorithm that we propose is a randomized variant of the ori...
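A minimal illustration of the subsampling idea over the unit simplex, whose atoms are the coordinate vectors, is sketched below: the linear minimization is performed only over a randomly sampled subset of vertices. This is a simplified stand-in for the randomized oracle analyzed in the cited paper, not its exact procedure; names and the sampling rule are assumptions.

```python
import numpy as np

def subsampled_lmo_simplex(g, sample_size, rng=None):
    """Pick the best vertex of the simplex among a random subset of coordinates."""
    rng = np.random.default_rng() if rng is None else rng
    n = g.shape[0]
    idx = rng.choice(n, size=min(sample_size, n), replace=False)
    i = idx[np.argmin(g[idx])]             # best sampled coordinate
    v = np.zeros(n)
    v[i] = 1.0
    return v
```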


Private Empirical Risk Minimization Beyond the Worst Case: The Effect of the Constraint Set Geometry

Empirical Risk Minimization (ERM) is a standard technique in machine learning, where a model is selected by minimizing a loss function over a constraint set. When the training dataset consists of private information, it is natural to use a differentially private ERM algorithm, and this problem has been the subject of a long line of work [CM08, KST12, JKT12, ST13a, DJW13, JT14, BST14, Ull14]. A pr...


On the Global Linear Convergence of Frank-Wolfe Optimization Variants

The Frank-Wolfe (FW) optimization algorithm has lately regained popularity thanks in particular to its ability to nicely handle the structured constraints appearing in machine learning applications. However, its convergence rate is known to be slow (sublinear) when the solution lies at the boundary. A simple, less-known fix is to add the possibility of taking ‘away steps’ during optimization, an o...
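For concreteness, the sketch below shows how one iteration of the away-step variant over the unit simplex chooses between the classical Frank-Wolfe direction and an away direction that moves mass off the worst active vertex, which is the fix alluded to above. It is a schematic fragment under simplifying assumptions (simplex domain, illustrative names), not the full algorithm from the cited paper.

```python
import numpy as np

def away_or_fw_direction(x, g, eps=1e-12):
    """Return (direction, max step size) for one away-step Frank-Wolfe iteration (sketch)."""
    s = np.argmin(g)                       # Frank-Wolfe vertex e_s
    active = np.where(x > eps)[0]          # support (active vertices) of the iterate
    a = active[np.argmax(g[active])]       # away vertex: worst active coordinate
    d_fw = -x.copy()
    d_fw[s] += 1.0                         # Frank-Wolfe direction e_s - x
    d_aw = x.copy()
    d_aw[a] -= 1.0                         # away direction x - e_a
    if -g @ d_fw >= -g @ d_aw:             # pick the direction with the larger descent estimate
        return d_fw, 1.0
    # Away step: the maximum step size keeps coordinate a nonnegative.
    alpha_max = x[a] / (1.0 - x[a]) if x[a] < 1.0 else np.inf
    return d_aw, alpha_max
```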




Publication date: 2017